Distributed Digital Data: Keeping files consistent, timely and small
Authors
Abstract
Digitalization of information across organizations has made inter-organizational information exchange easier. Such exchange, however, requires governmental organizations to define security policies stating which information may be accessed, processed and indexed by which organizations, and when, where and how. Ensuring that the information, once exchanged, stays up to date is a real challenge, as is enforcing the security policies. This paper proposes the use of a Distributed Digital Dossier in combination with agent technology to enforce these requirements. The domain of a Court of Law is used as an example to illustrate the approach.
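The abstract does not specify the paper's policy model, but the idea of policies stating which organization may access, process or index which information, and when, can be sketched roughly. All class and field names below are illustrative assumptions, not the paper's actual design:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class AccessPolicy:
    """Hypothetical policy: which organization may perform which
    actions on a dossier part, and during which time window."""
    organization: str
    actions: frozenset       # e.g. {"access", "process", "index"}
    valid_from: datetime
    valid_until: datetime

@dataclass
class DossierPart:
    """One part of a distributed dossier, guarded by a list of policies."""
    name: str
    policies: list = field(default_factory=list)

    def is_permitted(self, organization: str, action: str,
                     when: datetime) -> bool:
        """An enforcement agent would evaluate each request against
        the policies attached to the dossier part."""
        return any(
            p.organization == organization
            and action in p.actions
            and p.valid_from <= when <= p.valid_until
            for p in self.policies
        )

# Example: a court may access and process a statement; other
# organizations and other actions are denied by default.
part = DossierPart("witness statement", [
    AccessPolicy("Court", frozenset({"access", "process"}),
                 datetime(2020, 1, 1), datetime(2030, 1, 1)),
])
```

In this sketch, denial is the default: a request is granted only if some policy explicitly covers the organization, the action, and the time of the request.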
Similar references
Gis-based Hydrologic Modeling: the Automated Geospatial Watershed Assessment Tool
based model describing the processes of interception, infiltration, surface runoff, and erosion from small agricultural and urban watersheds (Smith et al., 1995). In this model, watersheds are represented by subdividing contributing areas into a cascade of one-dimensional overland flow and channel elements using topographic information. A broadly updated version of KINEROS, KINEROS2 may be used...
The Automated Geospatial Watershed Assessment tool
LH* Schemes with Scalable Availability
Modern applications increasingly require scalable, highly available and distributed storage systems. High-availability schemes typically deliver data despite up to n ≥ 1 simultaneous unavailabilities of the storage nodes (disks, processors with storage, or entire computers), where n is fixed. Such schemes are insufficient for scalable files, since the probability of more than n failures increas...
An Efficient Approach to Optimize the Performance of Massive Small Files in Hadoop MapReduce Framework
The most popular open source distributed computing framework, Hadoop, was designed by Doug Cutting and his team; it uses thousands of nodes to process and analyze huge amounts of data, called Big Data. The major core components of Hadoop are HDFS (Hadoop Distributed File System) and MapReduce. This framework is the most popular and powerful for storing, managing and processing Big Data appl...
A Hardware/Software Cosynthesis System for Digital Signal Processor Cores with Two Types of Register Files
In digital signal processing, the bit width of intermediate variables should be greater than that of input and output variables in order to execute intermediate operations with high precision. A processor core for digital signal processing therefore requires two types of register files, one used for input and output variables and the other for intermediate variables. This...